Pseudo-Recursal: Solving the Catastrophic Forgetting Problem in Deep Neural Networks

Authors

  • Craig Atkinson
  • Brendan McCane
  • Lech Szymanski
  • Anthony V. Robins

Abstract

In general, neural networks are not currently capable of learning tasks in a sequential fashion. When a novel, unrelated task is learnt by a neural network, it substantially forgets how to solve previously learnt tasks. One of the original solutions to this problem is pseudo-rehearsal, which involves learning the new task while rehearsing generated items representative of the previous task(s). This is very effective for simple tasks. However, pseudo-rehearsal has not yet been successfully applied to very complex tasks because, for these, it is difficult to generate representative items. We accomplish pseudo-rehearsal by using a Generative Adversarial Network to generate items so that our deep network can learn to sequentially classify the CIFAR-10, SVHN and MNIST datasets. After training on all tasks, our network loses only 1.67% absolute accuracy on CIFAR-10 and gains 0.24% absolute accuracy on SVHN. Our model's performance is a substantial improvement over the current state-of-the-art solution.
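
As a rough illustration of the pseudo-rehearsal idea described in the abstract, the following is a minimal sketch in PyTorch: a GAN trained on the previous task(s) supplies pseudo-items, the frozen previous network labels them, and the current network is trained on new-task data mixed with these labelled pseudo-items. The toy `make_generator` and `make_classifier` networks, `LATENT_DIM`, and the equal weighting of the two loss terms are illustrative assumptions, not the paper's actual architecture or hyper-parameters.

```python
# A minimal sketch of GAN-based pseudo-rehearsal, assuming PyTorch.
# The toy networks, LATENT_DIM, and the equal loss weighting are
# illustrative assumptions, not the paper's actual setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

LATENT_DIM = 100  # size of the GAN's noise input (assumed)

def make_classifier(num_classes: int = 10) -> nn.Module:
    """Toy stand-in for the deep classification network."""
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(3 * 32 * 32, 512), nn.ReLU(),
        nn.Linear(512, num_classes),
    )

def make_generator() -> nn.Module:
    """Toy stand-in for a GAN generator already trained on the old task(s)."""
    return nn.Sequential(
        nn.Linear(LATENT_DIM, 512), nn.ReLU(),
        nn.Linear(512, 3 * 32 * 32), nn.Tanh(),
        nn.Unflatten(1, (3, 32, 32)),
    )

def pseudo_rehearsal_step(model, old_model, generator, x_new, y_new, optimizer):
    """One training step: new-task loss plus rehearsal on generated items.

    Pseudo-items are sampled from the generator and labelled with the
    previous network's soft outputs, so the current network is pulled
    towards its old behaviour on (approximately) old-task inputs while
    it fits the new task.
    """
    optimizer.zero_grad()

    # Ordinary supervised loss on the new task's real data.
    loss_new = F.cross_entropy(model(x_new), y_new)

    # Sample pseudo-items and record the old network's responses to them.
    with torch.no_grad():
        z = torch.randn(x_new.size(0), LATENT_DIM)
        x_pseudo = generator(z)
        old_probs = F.softmax(old_model(x_pseudo), dim=1)

    # Pull the current network's outputs towards the old ones (KL divergence).
    loss_rehearsal = F.kl_div(
        F.log_softmax(model(x_pseudo), dim=1), old_probs, reduction="batchmean"
    )

    # Equal weighting of the two terms is an assumption for this sketch.
    (loss_new + loss_rehearsal).backward()
    optimizer.step()
```

In use, `pseudo_rehearsal_step` would be called once per mini-batch of the new task, with `old_model` a frozen copy of the network taken before new-task training begins.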

Similar Articles

Less-forgetting Learning in Deep Neural Networks

The catastrophic forgetting problem makes deep neural networks forget previously learned information when learning from data collected in new environments, such as by different sensors or under different lighting conditions. This paper presents a new method for alleviating the catastrophic forgetting problem. Unlike previous research, our method does not use any information from the source domain. Surp...

Measuring Catastrophic Forgetting in Neural Networks

Deep neural networks are used in many state-of-the-art systems for machine perception. Once a network is trained to do a specific task, e.g., bird classification, it cannot easily be trained to do new tasks, e.g., incrementally learning to recognize additional bird species or learning an entirely different task such as flower recognition. When new tasks are added, typical deep neural networks a...

Catastrophic interference in connectionist networks

  • Introduction
  • Catastrophic forgetting vs. normal forgetting
  • Measures of catastrophic interference
  • Solutions to the problem
  • Rehearsal and pseudorehearsal
  • Other techniques for alleviating catastrophic forgetting in neural networks
  • Summary

Evolving Neural Networks That Suffer Minimal Catastrophic Forgetting

Catastrophic forgetting is a well-known failing of many neural network systems whereby training on new patterns causes them to forget previously learned patterns. Humans have evolved mechanisms to minimize this problem, and in this paper we present our preliminary attempts to use simulated evolution to generate neural networks that suffer significantly less from catastrophic forgetting than tra...

Mitigating Catastrophic Forgetting in Temporal Difference Learning with Function Approximation

Neural networks have had many great successes in recent years, particularly with the advent of deep learning and many novel training techniques. One issue that has prevented reinforcement learning from taking full advantage of scalable neural networks is that of catastrophic forgetting. The latter affects supervised learning systems when highly correlated input samples are presented, as well as...

Journal:
  • CoRR

Volume: abs/1802.03875

Pages: -

Publication date: 2018